The object of inquiry here is the school - how has each school’s number of reported practices changed?

Number of practices selected over time

We’ll begin by including schools that participated in 2, 3, and 4 surveys, and looking at how many practices they selected each year in total. (The blue line shows the average, and the points for each school are connected by light gray lines.)
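The per-school counts and the blue-line average can be sketched as follows. This is a minimal stand-in, assuming a long-format list of (school, year, practice) records; the school and practice names are hypothetical.

```python
from collections import Counter

# Hypothetical long-format records: one row per practice a school
# selected on that year's survey.
records = [
    ("Alpha HS", 2019, "p1"), ("Alpha HS", 2019, "p2"),
    ("Alpha HS", 2021, "p1"), ("Alpha HS", 2021, "p2"), ("Alpha HS", 2021, "p3"),
    ("Beta HS", 2019, "p1"),
    ("Beta HS", 2021, "p2"), ("Beta HS", 2021, "p3"),
]

# Count practices selected per (school, year) -- one gray line per school.
counts = Counter((school, year) for school, year, _ in records)

# Average number of selected practices per year (the "blue line").
years = sorted({year for _, year, _ in records})
avg_per_year = {
    y: sum(n for (_, yr), n in counts.items() if yr == y)
       / len({s for (s, yr) in counts if yr == y})
    for y in years
}
print(avg_per_year)  # {2019: 1.5, 2021: 2.5}
```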

The graph above reflects the number of practices available to select: it is affected not only by which practices schools indicate, but also by how many choices they were given each year. Below, we recreate the same graph but limit it to practices that were included in all 4 surveys. Comparing the two, the decreases in the averages above are mostly due to the culling of tags; below we can see that the averages increase over time.

The table below shows the blue-line averages of the above graph (with 4+-year practices). There we can see that the average number of selected practices increased by about 6-7 practices over the entire 2019-2024 period (focusing mostly on the schools responding in 3 or more years).

Adds and Drops

Which schools are adding and dropping practices? The table below shows, for each school responding to multiple surveys, how many practices they added, dropped, and waffled on (both added and dropped). While the average may be 6 or 7 practices added overall, plenty of schools have added 20+ practices.
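The add/drop/waffle tally per school can be computed roughly like this. A sketch on hypothetical toy data: for each school, consecutive responses are compared, and a "waffle" is any practice that was both added and dropped at some point.

```python
# Hypothetical selections per school per survey year (sets of practice tags).
selections = {
    "Alpha HS": {2019: {"p1", "p2"}, 2021: {"p2", "p3"}, 2022: {"p1", "p2", "p3"}},
    "Beta HS":  {2019: {"p1"},       2021: {"p1", "p2"}},
}

def adds_drops_waffles(by_year):
    """Count practices added, dropped, and waffled (both added and
    dropped) across a school's consecutive survey responses."""
    years = sorted(by_year)
    added, dropped = set(), set()
    for prev, curr in zip(years, years[1:]):
        added |= by_year[curr] - by_year[prev]
        dropped |= by_year[prev] - by_year[curr]
    waffled = added & dropped
    return len(added), len(dropped), len(waffled)

for school, by_year in selections.items():
    print(school, adds_drops_waffles(by_year))
# Alpha HS: drops p1, re-adds it, and adds p3 -> (2, 1, 1)
# Beta HS: adds p2 only -> (1, 0, 0)
```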

I recommend digging deeper into the waffling, especially the few schools that have a lot of waffling. If a school indicates a practice was dropped and then re-added, there are possibilities of data compilation error, survey response error, or a COVID blip, and we may not want to read too much into any of those.

I’d also be interested in the schools that appear to have made drastic changes–if those changes can be corroborated and explained with some interviews or even just another survey response, it may be worth targeting them for some outreach.

Focus on the Wafflers

Purdue Polytechnic and Juab HS waffled a lot. Here’s the raw data, including all practices that they selected in at least 1 year.

Here’s a summary of the patterns, where 1 indicates selected and 0 indicates not selected in each survey, respectively. E.g., “1,0,1,1” means the practice was selected in 2019, not selected in 2021, and then selected again in 2022 and 2023. n is the number of practices for which this pattern was observed. We can see that Juab HS has 11 tags with the 1,0,1,1 pattern, meaning there are 11 practices that Juab indicated in all survey years except 2021.
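The pattern summary can be reproduced with a small sketch like the one below. The survey years and practice names are hypothetical stand-ins; the years follow the "1,0,1,1" example above.

```python
from collections import Counter

survey_years = [2019, 2021, 2022, 2023]

# Hypothetical: the years in which one school selected each practice.
selected = {
    "p1": {2019, 2022, 2023},        # the "1,0,1,1" pattern
    "p2": {2019, 2022, 2023},
    "p3": {2019, 2021, 2022, 2023},  # selected every year
}

def pattern(years_selected):
    """Encode a practice's selection history as e.g. '1,0,1,1'."""
    return ",".join("1" if y in years_selected else "0" for y in survey_years)

# n = number of practices showing each pattern.
pattern_counts = Counter(pattern(ys) for ys in selected.values())
print(pattern_counts)  # e.g. Counter({'1,0,1,1': 2, '1,1,1,1': 1})
```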

Repeating the Above for All 4+Year Schools

Practices added and dropped

The next obvious question is: which practices are schools adding and dropping? Again, we look at the tags present on 4+ surveys and see how many multiply-responding schools added or dropped them. The “waffled” column counts schools that both added and dropped the practice.
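Flipping from the per-school view to the per-practice view can be sketched as below, assuming each school's add and drop sets have already been computed (the `added_by`/`dropped_by` structures and tag names are hypothetical).

```python
from collections import defaultdict

# Hypothetical per-school sets of added and dropped practice tags.
added_by = {"Alpha HS": {"p1", "p3"}, "Beta HS": {"p1"}}
dropped_by = {"Alpha HS": {"p1"}, "Beta HS": {"p2"}}

# Per-practice counts of schools that added, dropped, or waffled on it.
adds, drops, waffles = defaultdict(int), defaultdict(int), defaultdict(int)
for school in added_by:
    for tag in added_by[school]:
        adds[tag] += 1
    for tag in dropped_by[school]:
        drops[tag] += 1
    # A school "waffled" on a tag if it both added and dropped it.
    for tag in added_by[school] & dropped_by[school]:
        waffles[tag] += 1

print(dict(adds), dict(drops), dict(waffles))
```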

The same information from the table above is presented in the graph below. Unsurprisingly, far more practices were added than dropped, with 8 practices showing a net decrease (it was only 4 until we added 2024!).

I am surprised by how many drops there are - 10 practices are dropped by 20 or more schools, and the 3 most dropped practices are dropped by 40+ schools. Those 3 are also the highest on the “waffle” count and should be investigated more deeply before we jump to conclusions. I can imagine a COVID-effect or an error in combining the data.

Broadening: Practices in 2+ years

Repeating the above analyses, but expanding the scope to include practices on the survey in 2024 and at least 1 other year.

Note that a practice included in only two years couldn’t possibly be “waffled”, so there will be relatively fewer waffles for these 2-year tags, and even 3-year tags have less opportunity to waffle.

Tag change plots

The practices_services_learning outlier needs some investigation: the table shows 81 adds and 57 drops, including 30 waffles.

Little bit of modeling

Here we’ll build a model for how volatile/dynamic schools’ tagging practices are.

We’ll look within schools. For all schools with multiple responses, we’ll make a data point out of each consecutive survey response, and for the tags that were used in both of those response years, we will look at the number of practices changed (number added plus number dropped). We will then model count of practice changes using school demography and ecology as predictors.
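The outcome variable described above can be sketched like this. A minimal example with hypothetical data structures (`tags_on_survey`, `school_selected`): the change count for a consecutive pair is restricted to tags that appeared on both surveys.

```python
# Hypothetical: tags offered on each survey year, and one school's
# selections in those years.
tags_on_survey = {2019: {"p1", "p2", "p3"}, 2021: {"p2", "p3", "p4"}}
school_selected = {2019: {"p1", "p2"}, 2021: {"p3", "p4"}}

def n_changes(prev_year, curr_year):
    """Practices changed (added + dropped) between two consecutive
    responses, restricted to tags present on both surveys."""
    shared = tags_on_survey[prev_year] & tags_on_survey[curr_year]
    prev = school_selected[prev_year] & shared
    curr = school_selected[curr_year] & shared
    return len(curr - prev) + len(prev - curr)

# Shared tags are {p2, p3}; the school dropped p2 and added p3 -> 2 changes.
print(n_changes(2019, 2021))  # 2
```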

The time periods are important too, and we know that the pandemic had a large effect on many practices. To try to capture the relevant time details without overcomplicating the model, we will classify each time transition into one of four cases:

  1. to_covid for transitions from 2019 to 2020
  2. from_covid for transitions from 2020 to a later year
  3. post_covid for transitions that start after 2020
  4. skip_covid for transitions from 2019 to a post-2020 year
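The four-way classification above can be expressed as a small helper; this is a sketch of the scheme as described, with the case names taken from the list.

```python
def classify_transition(year_from, year_to):
    """Classify a consecutive-response transition into one of the four
    pandemic-relative cases described above."""
    if year_from == 2019 and year_to == 2020:
        return "to_covid"
    if year_from == 2019 and year_to > 2020:
        return "skip_covid"
    if year_from == 2020:
        return "from_covid"
    return "post_covid"  # transitions that start after 2020

print(classify_transition(2019, 2020))  # to_covid
print(classify_transition(2020, 2022))  # from_covid
print(classify_transition(2021, 2023))  # post_covid
print(classify_transition(2019, 2022))  # skip_covid
```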

For now, we will not put the number of years in the transition as a numeric variable, as that will be mostly dependent on the transition types, though with more years of data we may want to include that (and perhaps keep things focused post-pandemic).

It doesn’t really seem like there’s much of interest here. When we switch from “modified” to “added”, the transition_to_covid variable impact shoots up (relatively), so schools added a bunch of practices during COVID.

Our two outliers from above, Juab and Purdue Polytechnic, could be the reason Rural looks significant. Below, we re-run the models above omitting Juab and Purdue, but the Rural effect is still strong.

Next we’ll try looking only post-COVID:

There’s not a lot going on here; I think it’s time to call it on the volatility modeling.

Consistency and Carrying On

When a school completes a survey for the 2nd time (or more), let’s look at the overlap in the tag selections they make. For each 2nd+ survey response, we’ll calculate a consistency fraction: among the tags present in both survey years, the number of changes the school made (adds plus drops) divided by the number of tags selected in the first year.
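The fraction just defined can be sketched as follows; the function name and toy tag sets are hypothetical, and the calculation follows the definition above (so a higher value means more churn, not more consistency).

```python
def consistency_fraction(first_selected, second_selected, shared_tags):
    """Changes (adds + drops) among tags present in both survey years,
    divided by the number of shared tags selected in the first year."""
    prev = first_selected & shared_tags
    curr = second_selected & shared_tags
    n_changes = len(curr - prev) + len(prev - curr)
    return n_changes / len(prev) if prev else float("nan")

# Hypothetical example: 1 add + 1 drop over 2 first-year tags -> 1.0
print(consistency_fraction({"p1", "p2"}, {"p2", "p3"}, {"p1", "p2", "p3"}))
```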